
    Estimating the Economic Losses from Diseases and Extended Days Open with a Farm-Level Stochastic Model

    This thesis improved a farm-level stochastic model with Monte Carlo simulation to estimate the impact of health performance and market conditions on dairy farm economics. The main objective of this model was to estimate the costs of seven common clinical dairy diseases (mastitis, lameness, metritis, retained placenta, left displaced abomasum, ketosis, and milk fever) in the U.S. An online survey was conducted to collect veterinary fees, treatment costs, and producer labor data. The total disease costs were higher in multiparous cows than in primiparous cows. Left displaced abomasum had the greatest costs in all parities ($404.74 in primiparous cows and $555.79 in multiparous cows). Milk loss, treatment costs, and culling costs were the three largest cost categories for all diseases. A secondary objective of this model was to evaluate the dairy cow's value, the optimal culling decision, and the cost of days open with flexible model inputs. Dairy cow value under 2013 market conditions was lower than in previous studies due to high slaughter and feed prices and a low replacement price. The first optimal replacement moment appeared in the middle of the first parity. Furthermore, the cost of days open was considerably influenced by market conditions.
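    A minimal sketch of the kind of farm-level Monte Carlo cost estimate described above, assuming a single disease with a hypothetical incidence rate and cost distributions; the component names mirror the abstract's largest cost categories, but every number is an illustrative placeholder rather than the thesis's calibrated input.

```python
import random

# Hypothetical per-case cost distributions (USD) for one disease; these
# numbers are placeholders, not the thesis's survey-derived estimates.
COST_COMPONENTS = {
    "milk_loss": lambda: random.gauss(150.0, 40.0),
    "treatment": lambda: random.gauss(90.0, 25.0),
    "culling":   lambda: random.gauss(120.0, 60.0),
    "labor":     lambda: random.gauss(30.0, 10.0),
}
INCIDENCE = 0.05  # assumed probability a cow contracts the disease per lactation

def simulate_herd_cost(n_cows: int, n_iterations: int = 10_000) -> float:
    """Mean total disease cost for a herd, averaged over Monte Carlo draws."""
    totals = []
    for _ in range(n_iterations):
        herd_cost = 0.0
        for _ in range(n_cows):
            if random.random() < INCIDENCE:
                # Sum a fresh draw from each cost category, floored at zero.
                herd_cost += sum(max(0.0, draw()) for draw in COST_COMPONENTS.values())
        totals.append(herd_cost)
    return sum(totals) / len(totals)

print(f"Mean herd disease cost: ${simulate_herd_cost(100):,.2f}")
```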

    Optimizing Lossy Compression Rate-Distortion from Automatic Online Selection between SZ and ZFP

    With ever-increasing volumes of scientific data produced by HPC applications, significantly reducing data size is critical because of the limited capacity of storage space and potential bottlenecks on I/O or networks when writing, reading, or transferring data. SZ and ZFP are the two leading lossy compressors available for scientific data sets. However, their performance is not consistent across different data sets or across different fields of the same data set: some fields are better compressed by SZ, while others are better compressed by ZFP. This situation raises the need for an automatic online (during compression) selection between SZ and ZFP with minimal overhead. In this paper, the automatic selection optimizes the rate-distortion, an important statistical quality metric based on the signal-to-noise ratio. To optimize for rate-distortion, we investigate the principles of SZ and ZFP. We then propose an efficient online, low-overhead selection algorithm that accurately predicts the compression quality of the two compressors in early processing stages and selects the best-fit compressor for each data field. We implement the selection algorithm in an open-source library, and we evaluate the effectiveness of our proposed solution against plain SZ and ZFP in a parallel environment with 1,024 cores. Evaluation results on three data sets representing about 100 fields show that our selection algorithm improves the compression ratio by up to 70% at the same level of data distortion, thanks to very accurate selection (around 99%) of the best-fit compressor with little overhead (less than 7% in the experiments).
    Comment: 14 pages, 9 figures, first revision
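    A rough sketch of the per-field selection idea, assuming hypothetical compress/decompress callables standing in for SZ and ZFP; it brute-forces the rate-distortion comparison on a small subsample, whereas the paper predicts compression quality from each compressor's internals at even lower cost.

```python
import numpy as np

def psnr(original: np.ndarray, decompressed: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB, the distortion side of rate-distortion."""
    mse = np.mean((original - decompressed) ** 2)
    if mse == 0:
        return float("inf")
    value_range = original.max() - original.min()
    return 20 * np.log10(value_range) - 10 * np.log10(mse)

def select_compressor(field: np.ndarray, compressors: dict) -> str:
    """Pick the compressor with the best quality-per-bit on a coarse subsample.

    `compressors` maps a name to a hypothetical (compress, decompress) pair
    returning bytes and a reconstructed array, respectively.
    """
    sample = field.ravel()[:: max(1, field.size // 4096)]
    best_name, best_score = None, -np.inf
    for name, (compress, decompress) in compressors.items():
        compressed = compress(sample)
        bit_rate = 8 * len(compressed) / sample.size   # bits per value (rate)
        score = psnr(sample, decompress(compressed)) / bit_rate
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

    Trial compression on a subsample is the naive baseline; the appeal of the paper's predictor is that it avoids even this sampled pass while still choosing the best-fit compressor around 99% of the time.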

    Study on the space-time structure of Higgs boson decay using the HBT correlation method in e⁺e⁻ collisions at √s = 250 GeV

    The space-time structure of Higgs boson decay is carefully studied with the HBT correlation method, using e⁺e⁻ collision events produced by the Monte Carlo generator PYTHIA 8.2 at √s = 250 GeV. The Higgs boson jets (Higgs-jets) are identified by H-tag tracing. Measurements of the Higgs boson radius and decay lifetime are derived from the HBT correlation of its decay final-state pions inside Higgs-jets in the e⁺e⁻ collision events, with upper bounds of R_H ≤ 1.03 ± 0.05 fm and τ_H ≤ (1.29 ± 0.15) × 10⁻⁷ fs. This result is consistent with CMS data.
    Comment: 7 pages, 3 figures
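    The radius extraction in HBT analyses typically rests on fitting the two-pion correlation function with a Gaussian (Goldhaber) parametrization, C(Q) = 1 + λ·exp(−(RQ/ħc)²). A minimal sketch of that fit, with synthetic data standing in for the measured same-event to mixed-event pair ratio:

```python
import numpy as np
from scipy.optimize import curve_fit

HBARC = 0.1973  # GeV*fm, converts Q in GeV to inverse fm

def gaussian_hbt(q, lam, r_fm):
    """Goldhaber form C(Q) = 1 + lambda * exp(-(R*Q/hbar-c)^2)."""
    return 1.0 + lam * np.exp(-(r_fm * q / HBARC) ** 2)

# Synthetic correlation data standing in for the measured A(Q)/B(Q) ratio
# of same-event to mixed-event pion pairs inside Higgs-jets.
q = np.linspace(0.02, 1.0, 50)                      # pair momentum difference (GeV)
c_measured = gaussian_hbt(q, 0.6, 1.0) + np.random.normal(0, 0.01, q.size)

(lam_fit, r_fit), cov = curve_fit(gaussian_hbt, q, c_measured, p0=[0.5, 0.8])
r_err = np.sqrt(cov[1, 1])
print(f"lambda = {lam_fit:.2f}, R = {r_fit:.2f} +/- {r_err:.2f} fm")
```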

    Characterizing and Modeling the Dynamics of Activity and Popularity

    Social media, regarded as two-layer networks consisting of users and items, have become the most important channels for access to massive information in the era of Web 2.0. The dynamics of human activity and item popularity is a crucial issue in social media networks. In this paper, by analyzing the growth of user activity and item popularity in four empirical social media networks (Amazon, Flickr, Delicious, and Wikipedia), it is found that cross links between users and items are more likely to be created by active users and to be acquired by popular items, where user activity and item popularity are measured by the number of cross links associated with users and items. This indicates that, overall, users tend to trace popular items. However, inactive users are found to trace popular items more strongly than active users do. Inspired by this empirical analysis, we propose an evolving model for such networks in which the evolution is driven only by a two-step random walk. Numerical experiments verify that the model can qualitatively reproduce the distributions of user activity and item popularity observed in the empirical networks. These results may shed light on the micro dynamics of activity and popularity in social media networks.
    Comment: 13 pages, 6 figures, 2 tables
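    The abstract does not spell out the walk rule, so the sketch below assumes one common reading: an existing user walks user → item → user and copies an item from the user it reaches. Because both the walker and the copied item are reached in proportion to degree, this yields rich-get-richer growth in both activity and popularity.

```python
import random
from collections import defaultdict

user_items = defaultdict(set)   # user -> its items
item_users = defaultdict(set)   # item -> its users
links = []                      # (user, item) pairs, for degree-biased sampling

def add_link(u, i):
    if i not in user_items[u]:
        user_items[u].add(i)
        item_users[i].add(u)
        links.append((u, i))

add_link(0, 0)  # seed the bipartite network with one user-item link

for t in range(1, 20_000):
    if random.random() < 0.2:
        # A new user joins by copying the item end of a random existing link,
        # so popular items attract newcomers in proportion to their degree.
        add_link(t, random.choice(links)[1])
    else:
        # An existing user (picked activity-proportionally via a random link)
        # takes a two-step walk: user -> one of its items -> one of that
        # item's users, then copies a random item of the user it lands on.
        u, _ = random.choice(links)
        i = random.choice(tuple(user_items[u]))
        v = random.choice(tuple(item_users[i]))
        add_link(u, random.choice(tuple(user_items[v])))

activity = sorted((len(s) for s in user_items.values()), reverse=True)
print("top user activities:", activity[:10])
```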